1.
Parasit Vectors ; 17(1): 106, 2024 Mar 04.
Article in English | MEDLINE | ID: mdl-38439081

ABSTRACT

BACKGROUND: Although whole-genome sequencing (WGS) is the preferred genotyping method for most genomic analyses, it has limitations for genomes characterized by a high percentage of repetitive elements, high linkage, and recombination deserts. The Asian tiger mosquito (Aedes albopictus), for example, has a genome comprising up to 72% repetitive elements; we therefore set out to develop a more cost-effective single-nucleotide polymorphism (SNP) chip. Aedes albopictus is an invasive species originating from Southeast Asia that has recently spread around the world and is a vector of many human diseases. Developing an accessible genotyping platform is essential for advancing biological control methods and understanding the population dynamics of this pest species, with significant implications for public health.

METHODS: We designed a SNP chip for Ae. albopictus (Aealbo chip) based on approximately 2.7 million SNPs identified using WGS data from 819 worldwide samples. We validated the chip using laboratory single-pair crosses, by comparing technical replicates, and by comparing genotypes of samples genotyped by both WGS and the SNP chip. We then used the chip for a population genomic analysis of 237 samples from 28 sites in the native range to evaluate its usefulness in describing patterns of genomic variation and tracing the origins of invasions.

RESULTS: Probes on the Aealbo chip targeted 175,396 SNPs in coding and non-coding regions across all three chromosomes, with a density of 102 SNPs per 1-Mb window and at least one SNP in each of the 17,461 protein-coding genes. Overall, 70% of the probes captured genetic variation. Segregation analysis found that 98% of the SNPs followed the expectations of single-copy Mendelian genes. Comparisons with WGS indicated that sites with genotype disagreements were mostly heterozygotes at loci with WGS read depth < 20, whereas there was near-complete agreement at WGS read depths > 20, indicating that the chip detects heterozygotes more accurately than low-coverage WGS. Sample size did not affect the accuracy of the SNP chip genotype calls. Ancestry analyses identified four to five genetic clusters in the native range with various levels of admixture.

CONCLUSIONS: The Aealbo chip is highly accurate, concordant with genotypes from high-coverage WGS, and may be more accurate than low-coverage WGS.


Subject(s)
Aedes , Mosquito Vectors , Humans , Animals , Genotype , Mosquito Vectors/genetics , Heterozygote , Aedes/genetics
2.
PLoS One ; 16(6): e0252579, 2021.
Article in English | MEDLINE | ID: mdl-34086754

ABSTRACT

Young adults entering college experience immense shifts in their personal and professional environments. Such a potentially stressful event may trigger multiple psychological and physiological effects. In a repeated-measures longitudinal survey (N = 6 time-points) of a first-year cohort of residential undergraduate students in India, this study evaluates multiple psychological parameters: the PSS14 (Perceived Stress Scale), the K10 (distress scale), and positive mood measures, along with salivary cortisol levels. We find that, compared to women, men showed significantly lower levels of salivary cortisol as well as a decrease in perceived stress (PSS14) and distress (K10) over time. By contrast, women reported similar perceived stress and distress levels over time but had higher cortisol levels at the end of the academic year. Students reported academic stress as the most important stressor. This study highlights notable gender/sex differences in psychological and physiological stress responses and adds a valuable longitudinal dataset from an Indian undergraduate student cohort, which is lacking in the literature.


Subject(s)
Hydrocortisone/analysis , Stress, Psychological , Students/psychology , Adolescent , Female , Humans , India , Longitudinal Studies , Male , Psychological Distress , Saliva/metabolism , Universities , Young Adult
3.
Ecology ; 95(2): 280-5, 2014 Feb.
Article in English | MEDLINE | ID: mdl-24669722

ABSTRACT

In predator-prey foraging games, predators should respond to variation in prey state, since the value of energy to the prey changes with season. Prey in a low energetic state and/or in a reproductive state should invest more in foraging and tolerate higher predation risk. This should make them more catchable and thereby more preferable to predators. We ask: can predators respond to prey state? How do season and state affect the foraging game from the predator's perspective? By letting owls choose between gerbils whose states we experimentally manipulated, we could demonstrate predator sensitivity to prey state and predator selectivity that might otherwise be obscured by the foraging game. During spring, owls invested more time and attacks in the patch with well-fed gerbils. During summer, owls attacked both patches equally, yet allocated more time to the patch with hungry gerbils. Thus, energetic state per se does not seem to be the basis of owl choice, yet the owls responded strongly to these subtle differences. In summer, gerbils managed their behavior primarily for survival, and the owls equalized capture opportunities by attacking both patches equally.


Subject(s)
Gerbillinae/physiology , Predatory Behavior/physiology , Strigiformes/physiology , Animals , Body Constitution , Food Deprivation , Seasons
4.
PLoS One ; 9(2): e88832, 2014.
Article in English | MEDLINE | ID: mdl-24551171

ABSTRACT

Key to predicting the impacts of predation is understanding the mechanisms through which predators impact prey populations. While consumptive effects are well known, non-consumptive predator effects (risk effects) are increasingly being recognized as important. Studies of risk effects, however, have focused largely on how trade-offs between food and safety affect fitness. Less documented and appreciated is the potential for predator presence to directly suppress prey reproduction and affect life-history characteristics. For the first time, we tested the effects of visual predator cues on the reproduction of two prey species with different reproductive modes: lecithotrophy (i.e., embryonic development fueled primarily by yolk) and matrotrophy (i.e., energy for embryonic development supplied directly by the mother to the embryo through a vascular connection). Predation risk suppressed reproduction in the lecithotrophic prey (Gambusia holbrooki) but not in the matrotroph (Heterandria formosa). Predator stress caused G. holbrooki to reduce clutch size by 43% and to produce larger and heavier offspring compared to control females; H. formosa did not show any such difference. In G. holbrooki we also found a significantly higher percentage of stillbirths in predator-exposed treatments (14%) than in controls (2%). To the best of our knowledge, this is the first direct empirical evidence of predation stress affecting stillbirths in prey. Our results suggest that matrotrophy, superfetation (clutch overlap), or both decrease the sensitivity of mothers to environmental fluctuations in resource (food) and stress (predation risk) levels compared to lecithotrophy. These mechanisms should be considered both when modeling the consequences of perceived predation risk on predator-prey population dynamics and when seeking to understand the evolution of reproductive modes.


Subject(s)
Cyprinodontiformes/physiology , Killifishes/physiology , Reproduction , Stress, Psychological , Animals , Bass/physiology , Biological Evolution , Clutch Size , Female , Food Chain , Population Dynamics , Predatory Behavior/physiology , Risk
5.
Biol Rev Camb Philos Soc ; 88(3): 550-63, 2013 Aug.
Article in English | MEDLINE | ID: mdl-23331494

ABSTRACT

How foragers balance risks during foraging is a central focus of optimal foraging studies. While diverse theoretical and empirical work has revealed how foragers should and do manage food and safety from predators, little attention has been given to the risks posed by dangerous prey. This is a potentially important oversight because risk of injury can give rise to foraging costs similar to those arising from the risk of predation, and with similar consequences. Here, we synthesize the literature on how foragers manage the risks associated with dangerous prey and adapt previous theory to take the first steps towards a framework for future studies. Though rarely documented, it appears that in some systems predators are frequently injured while hunting, and that risk of injury can be an important foraging cost. The fitness costs of foraging injuries, which can be fatal, likely vary widely but have rarely been studied and should be the subject of future research. As with other types of risk-taking behaviour, there appears to be individual variation in the willingness to take risks, which can be driven by social factors, experience and foraging abilities, or differences in body condition. Because of ongoing modifications to natural communities, including changes in prey availability and relative abundance as well as the introduction of potentially dangerous prey to numerous ecosystems, understanding the prevalence and consequences of hunting dangerous prey should be a priority for behavioural ecologists.


Subject(s)
Herbivory/physiology , Predatory Behavior/physiology , Animals , Ecosystem
6.
Ecol Lett ; 13(3): 302-10, 2010 Mar.
Article in English | MEDLINE | ID: mdl-20455918

ABSTRACT

Predator-prey interactions are often behaviourally sophisticated games in which the predator and prey are players. Past studies teach us that hungrier prey take higher risks when foraging, and that hungrier predators increase their foraging activity and are willing to take higher risks of injury. Yet no study has looked at the simultaneous responses of predator and prey to their own and each other's hunger levels in a controlled environment. We looked for evidence of a state-dependent game between predators and their prey by simultaneously manipulating the hunger states of barn owls as predators and of Allenby's gerbils as their prey. The owls significantly increased their activity when hungry; however, they did not appear to respond to changes in the hunger state of the gerbils. The gerbils reacted strongly to the owls' state, as well as to their own state when the risk was perceived as high. Our study shows that predator-prey interactions give rise to a complex state-dependent game.


Subject(s)
Feeding Behavior , Gerbillinae/physiology , Hunger , Predatory Behavior , Strigiformes/physiology , Animals , Games, Experimental
7.
Proc Biol Sci ; 277(1687): 1469-74, 2010 May 22.
Article in English | MEDLINE | ID: mdl-20053649

ABSTRACT

Foraging animals have several tools for managing the risk of predation and the foraging games between themselves and their predators. Foremost among these is time allocation, followed by vigilance and apprehension. Together, their use determines a forager's time allocation and giving-up density (GUD) in depletable resource patches. We examined Allenby's gerbils (Gerbillus andersoni allenbyi) exploiting seed resource patches in a large vivarium under varying moon phases in the presence of a red fox (Vulpes vulpes). We measured time allocated to foraging patches electronically, and GUDs from the seeds left behind in resource patches. From these, we estimated handling times, attack rates, and quitting harvest rates (QHRs). Gerbils displayed greater vigilance (lower attack rates) at brighter moon phases (full < wane < wax < new). Similarly, they displayed higher GUDs at brighter moon phases (wax > full > new > wane). Finally, gerbils displayed higher QHRs at new and waxing moon phases. Differences across moon phases reflect not only changing time allocation and vigilance, but also changes in the state of the foragers and their marginal value of energy. Early in the lunar cycle, gerbils rely on vigilance and sacrifice state to avoid risk; later, they defend state at the cost of increased time allocation; finally, their state can recover as safe opportunities expand. In the predator-prey foraging game, foxes may contribute to these patterns of behaviour by modulating their own activity in response to the opportunities presented in each moon phase.


Subject(s)
Feeding Behavior/physiology , Foxes/physiology , Gerbillinae/physiology , Moon , Animals , Behavior, Animal , Panicum , Predatory Behavior , Risk Factors , Seeds , Time Factors
8.
Behav Ecol Sociobiol ; 63(12): 1821-1827, 2009 Oct.
Article in English | MEDLINE | ID: mdl-19779627

ABSTRACT

Theory states that an optimal forager should exploit a patch so long as its harvest rate of resources from the patch exceeds its energetic, predation, and missed-opportunity costs of foraging. However, for many foragers, predation is not the only source of danger they face while foraging; foragers also risk injuring themselves. To test whether risk of injury gives rise to a foraging cost, we offered red foxes pairs of depletable resource patches in which they experienced diminishing returns. The resource patches were identical in all respects save for the risk of injury. In response, the foxes exploited the safe patches more intensively: they foraged for a longer time and removed more food (i.e., had lower giving-up densities) in the safe patches compared to the risky patches. Although they never sustained injury, video footage revealed that the foxes took greater care while foraging from the risky patches and removed food at a slower rate. Furthermore, an increase in their hunger state led foxes to allocate more time to foraging from the risky patches, thereby exposing themselves to higher risks. Our results suggest that foxes treat risk of injury as a foraging cost and use time allocation and daring (the willingness to risk injury) as tools for managing their risk of injury while foraging. To our knowledge, this is the first study that explicitly tests and shows that risk of injury is indeed a foraging cost. While nearly all foragers may face an injury cost of foraging, we suggest that this cost will be largest and most important for predators.

9.
Oecologia ; 159(3): 661-8, 2009 Mar.
Article in English | MEDLINE | ID: mdl-19082629

ABSTRACT

Predator-prey studies often assume a three-trophic-level system in which predators forage free from any risk of predation. Since meso-predators are themselves also prospective prey, they too need to trade off food and safety. We applied foraging theory to study patch use and habitat selection by a meso-predator, the red fox. We present evidence that foxes use a quitting-harvest-rate rule when deciding whether or not to abandon a foraging patch, and that they experience diminishing returns when foraging from a depletable food patch. Furthermore, our data suggest that the patch use decisions of red foxes are influenced not just by the availability of food but also by their perceived risk of predation. Fox behavior was affected by moonlight, with foxes depleting food resources more thoroughly (lower giving-up density) on darker nights than on moonlit nights. Foxes reduced risk from hyenas by being more active where and when hyena activity was low: while hyenas were least active during new moon nights and most active during full moon nights, the reverse was true for foxes. Foxes showed twice as much activity during new moon as during full moon nights, suggesting different costs of predation. Interestingly, resources in patches with cues of another predator (wolf scat) were depleted to significantly lower levels than patches without. Our results emphasize the need to consider risk of predation for intermediate predators, and show how patch use theory and experimental food patches can be applied to a predator. Taken together, these results may help us better understand trophic interactions.


Subject(s)
Foxes/physiology , Predatory Behavior , Animals
10.
Ecology ; 88(3): 597-604, 2007 Mar.
Article in English | MEDLINE | ID: mdl-17503587

ABSTRACT

Carrying capacity is one of the most important, yet least understood and rarely estimated, parameters in population management and modeling. A simple behavioral metric of carrying capacity would advance theory, conservation, and management of biological populations. Such a metric should be possible because behavior is finely attuned to variation in environment including population density. We connect optimal foraging theory with population dynamics and life history to develop a simple model that predicts this sort of adaptive density-dependent change in food consumption. We then confirm the model's unexpected and manifold predictions with field experiments. The theory predicts reproductive thresholds that alter the marginal value of energy as well as the value of time. Both effects cause a pronounced discontinuity in quitting-harvest rate that we revealed with foraging experiments. Red-backed voles maintained across a range of high densities foraged at a lower density-dependent rate than the same animals exposed to low-density treatments. The change in harvest rate is diagnostic of populations that exceed their carrying capacity. Ecologists, conservation biologists, and wildlife managers may thus be able to use simple and efficient foraging experiments to estimate carrying capacity and habitat quality.


Subject(s)
Arvicolinae/physiology , Ecosystem , Feeding Behavior/physiology , Models, Theoretical , Population Density , Analysis of Variance , Animals , Ontario , Population Dynamics